Training Iterative Collective Classifiers with Back-Propagation
Authors
Abstract
We propose a new method for training iterative collective classifiers for labeling nodes in network data. The iterative classification algorithm (ICA) is a canonical method for incorporating relational information into the classification process. Yet, existing methods for training ICA models rely on computing relational features from the true labels of the nodes, a practice that introduces a bias inconsistent with the actual prediction algorithm, which must compute these features from predicted labels. In this paper, we introduce a variant of ICA, ICA with back-propagation (BPICA), which treats prediction as a procedure analogous to that of a recurrent neural network and thereby enables gradient-based strategies for optimizing the model parameters. We demonstrate that training BPICA more directly optimizes the training loss of collective classification, which translates to improved accuracy and robustness on real network data. This robustness enables effective collective classification in settings where local classification is very noisy, settings that were previously particularly challenging for ICA and its variants.
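The abstract does not include an implementation, but the core idea of BPICA, unrolling the ICA inference loop and back-propagating the training loss through it, can be illustrated with a minimal sketch. The sketch below assumes PyTorch, a dense adjacency matrix A, node features X, and integer labels y; the class name UnrolledICA, the mean-neighbor aggregation, and all hyperparameters are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' code): ICA unrolled as a recurrent,
# differentiable procedure so the training loss can be back-propagated
# through the label-inference iterations.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UnrolledICA(nn.Module):
    def __init__(self, num_features, num_classes, num_iterations=5):
        super().__init__()
        self.num_iterations = num_iterations
        # Local classifier over node features plus aggregated neighbor predictions.
        self.classifier = nn.Linear(num_features + num_classes, num_classes)

    def forward(self, X, A):
        n = X.size(0)
        num_classes = self.classifier.out_features
        # Start from uninformative (uniform) soft labels rather than true labels.
        soft_labels = torch.full((n, num_classes), 1.0 / num_classes)
        # Row-normalize the adjacency matrix for mean aggregation over neighbors.
        deg = A.sum(dim=1, keepdim=True).clamp(min=1.0)
        A_norm = A / deg
        for _ in range(self.num_iterations):
            relational = A_norm @ soft_labels          # aggregate neighbor predictions
            logits = self.classifier(torch.cat([X, relational], dim=1))
            soft_labels = F.softmax(logits, dim=1)     # feed predictions back in
        return logits                                   # logits from the final iteration

# Toy usage: gradients flow through every inference iteration.
n, d, c = 20, 8, 3
X = torch.randn(n, d)
A = (torch.rand(n, n) < 0.2).float()
A = ((A + A.t()) > 0).float()                           # symmetrize the random graph
y = torch.randint(0, c, (n,))

model = UnrolledICA(d, c)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
for epoch in range(100):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(X, A), y)
    loss.backward()        # back-propagate through the unrolled ICA iterations
    optimizer.step()

Because the soft labels start from an uninformative guess and are refined by the model's own predictions at each iteration, training sees the same relational features that prediction will see, which is the mismatch the abstract identifies in standard ICA training.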
Similar resources
Practical Characteristics of Neural Network and Conventional Pattern Classifiers on Artificial and Speech Problems
Richard P. Lippmann, Lincoln Laboratory, MIT, Room B-349, Lexington, MA 02173-9108. Eight neural net and conventional pattern classifiers (Bayesian unimodal Gaussian, k-nearest neighbor, standard back-propagation, adaptive-stepsize back-propagation, hypersphere, feature-map, learning vector quantizer, and binary decision tree) were implemented on a serial computer and compared using two speech recog...
Performance comparison of three artificial neural network methods for classification of electroencephalograph signals of five mental tasks
In this paper, the performance of three classifiers for classification of five mental tasks was investigated. The Wavelet Packet Transform (WPT) was used to extract features from the relevant frequency bands of the raw electroencephalograph (EEG) signal. The three classifiers used were the Multilayer Back-propagation Neural Network, Support Vector Machine, and Radial Basis Function Neural Network. In...
Fast opposite weight learning rules with application in breast cancer diagnosis
Classification of breast abnormalities such as masses is a challenging task for radiologists. Computer-aided Diagnosis (CADx) technology may enhance the performance of radiologists by assisting them in classifying patterns into benign and malignant categories. Although Neural Networks (NN) such as Multilayer Perceptron (MLP) have drawbacks, namely long training times, a considerable number of C...
Recurrent Collective Classification
We propose a new method for training iterative collective classifiers for labeling nodes in network data. The iterative classification algorithm (ICA) is a canonical method for incorporating relational information into classification. Yet, existing methods for training ICA models rely on the assumption that relational features reflect the true labels of the nodes. This unrealistic assumption in...
Neural Net and Traditional Classifiers
Previous work on nets with continuous-valued inputs led to generative procedures to construct convex decision regions with two-layer perceptrons (one hidden layer) and arbitrary decision regions with three-layer perceptrons (two hidden layers). Here we demonstrate that two-layer perceptron classifiers trained with back propagation can form both convex and disjoint decision regions. Such classif...